Tutorial: Event-based logging output

In this notebook we show how to enable and use event-based logging, which provides a set of standardised outputs from binary_c for events like supernovae and Roche-lobe overflow episodes.

The events that are available in this version of binary_c-python are listed on the event-based logging page in the documentation.

The relevant options for this functionality are prefixed with event_based_logging_ (see the population options documentation and the binary_c options documentation).

Each event is tagged with a UUID that allows matching events that come from the same system. This is useful, for example, to track the events that preceded the formation of a double compact object, so we can study its evolutionary path.

Enabling event-based logging in binary_c

To enable the output of specific events, we set one or more of the following options:

  • event_based_logging_SN=1: Enables supernova event logging

  • event_based_logging_RLOF=1: Enables RLOF event logging

  • event_based_logging_DCO=1: Enables double compact-object event logging

Enabling event-based log handling in binary_c-python

To enable the automatic processing of the event logs by binary_c-python, we need to set event_based_logging_handle_output=True

This allows binary_c-python to automatically process the output of binary_c and create output files that contain the event logs.

There are several options related to the processing of these output files:

  • event_based_logging_output_directory: directory where the event-based logs will be written.

  • event_based_logging_combine_individual_event_files: whether to automatically combine all the process-specific event log files into one combined file.

  • event_based_logging_combined_events_filename: filename of the combined event file.

  • event_based_logging_remove_individual_event_files_after_combining: whether to remove the process-specific event log files after combining.

  • event_based_logging_split_events_file_to_each_type: whether to split the combined event file into event-specific files.

  • event_based_logging_remove_original_combined_events_file_after_splitting: whether to remove the combined event file after splitting.

  • event_based_logging_output_separator: separator used for writing the events.

  • event_based_logging_output_parser: parsing function to handle the output of binary_c. A suitable function is already provided, so there is no need to supply this yourself unless you have special requirements.

  • event_based_logging_parameter_list_dict: dictionary that contains the parameter-name list for each specific event (see the event-based logging page in the documentation). The dictionary currently provided is designed to handle the events present in this release, but if you add your own events you need to update this dictionary or provide a custom one.
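If you keep event_based_logging_split_events_file_to_each_type disabled (as we do below) but later want per-type files anyway, you can split the combined file yourself. A minimal sketch, assuming whitespace-separated event lines with the event type in the fourth column (index 3, matching the EVENT_TYPE_INDEX used later in this notebook); the function split_events_file is our own helper, not part of binary_c-python:

import os
from collections import defaultdict

def split_events_file(combined_filename, output_directory):
    """Split a combined events file into one file per event type."""
    EVENT_TYPE_INDEX = 3  # column holding the event type (e.g. SN_BINARY)

    # Collect the lines belonging to each event type
    events_per_type = defaultdict(list)
    with open(combined_filename, "r") as f:
        for line in f:
            if not line.strip():
                continue
            events_per_type[line.split()[EVENT_TYPE_INDEX]].append(line)

    # Write one file per event type
    for event_type, lines in events_per_type.items():
        with open(os.path.join(output_directory, f"{event_type}.dat"), "w") as f:
            f.writelines(lines)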

Example usage

We now show some example usage of the event-based logging. We start, as usual, with some imports and by setting up a population object.

[1]:
import os
import json

from binarycpython.utils.custom_logging_functions import temp_dir
from binarycpython import Population

TMP_DIR = temp_dir("notebooks", "notebook_events_based_logging", clean_path=True)
EVENT_TYPE_INDEX = 3

data_dir = os.path.join(TMP_DIR, 'data_dir')
os.makedirs(data_dir, exist_ok=True)

event_based_logging_population = Population(tmp_dir=TMP_DIR, verbosity=2)
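Since all the relevant options share the event_based_logging_ prefix, one way to inspect the binary_c-python side of them is to filter the population options dictionary. A minimal sketch, assuming the defaults for these options are already present in population_options at this point:

# List the event-based-logging population options by filtering the
# options dictionary on the shared prefix (the binary_c-side options,
# e.g. event_based_logging_SN, live in the BSE options instead)
event_based_logging_options = {
    key: value
    for key, value in event_based_logging_population.population_options.items()
    if key.startswith("event_based_logging_")
}
print(sorted(event_based_logging_options))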

Let's configure the population object to use event-based logging.

[2]:
# Set population object
event_based_logging_population.set(
    num_cores=2,
    # data_dir=data_dir,
    # binary-c options related to event-based logging
    event_based_logging_SN=1,
    event_based_logging_RLOF=1,
    event_based_logging_DCO=1,
    # binary_c-python options related to event-based logging
    event_based_logging_handle_output=True,
    event_based_logging_output_directory=os.path.join(TMP_DIR, 'events'),
    event_based_logging_combine_individual_event_files=True,
    event_based_logging_remove_individual_event_files_after_combining=False,
    event_based_logging_split_events_file_to_each_type=False,
    event_based_logging_remove_original_combined_events_file_after_splitting=False,
)
adding: num_cores=2 to population_options
adding: event_based_logging_SN=1 to BSE_options by catching the %d
adding: event_based_logging_RLOF=1 to BSE_options by catching the %d
adding: event_based_logging_DCO=1 to BSE_options by catching the %d
adding: event_based_logging_handle_output=True to population_options
adding: event_based_logging_output_directory=/tmp/binary_c_python-david/notebooks/notebook_events_based_logging/events to population_options
adding: event_based_logging_combine_individual_event_files=True to population_options
adding: event_based_logging_remove_individual_event_files_after_combining=False to population_options
adding: event_based_logging_split_events_file_to_each_type=False to population_options
adding: event_based_logging_remove_original_combined_events_file_after_splitting=False to population_options

And let's provide some systems that can generate events for us. We use a fixed list of systems through the source-file sampling functionality, but that is only for this example. For more serious sampling you can use e.g. the grid-based sampling (see the grid-based sampling notebook).

[3]:
# Configure the source-file sampling
system_dict_test_list = [
    {"M_1": 10},
    {"M_1": 10.0, "M_2": 0.1, "orbital_period": 1000000000},
    {"M_1": 1, "M_2": 0.5, "orbital_period": 100.0},
]

# Create file that contains the systems
source_file_sampling_filename = os.path.join(
    TMP_DIR, "source_file_sampling_filename.txt"
)

# write the source file
with open(source_file_sampling_filename, "w") as f:
    # Loop over system dict
    for system_dict_test_entry in system_dict_test_list:
        argline = " ".join(
            [
                "{} {}".format(key, val)
                for key, val in system_dict_test_entry.items()
            ]
        )
        f.write(argline + "\n")

# Update setting
event_based_logging_population.set(
    source_file_sampling_type="command",
    source_file_sampling_filename=source_file_sampling_filename,
    evolution_type="source_file"
)
adding: source_file_sampling_type=command to population_options
adding: source_file_sampling_filename=/tmp/binary_c_python-david/notebooks/notebook_events_based_logging/source_file_sampling_filename.txt to population_options
adding: evolution_type=source_file to population_options

Let's now run these systems and explore the contents of the event files.

[4]:
event_based_logging_population.evolve()
Warning: No parse function set. Make sure you intended to do this.
setting up the system_queue_filler now
Loading source file from /tmp/binary_c_python-david/notebooks/notebook_events_based_logging/source_file_sampling_filename.txt
Source file loaded
Signalling processes to stop

****************************************************
*                Process 1 finished:               *
*  generator started at 2023-05-19T11:46:05.171515 *
* generator finished at 2023-05-19T11:46:05.304666 *
*                   total: 0.13s                   *
*           of which 0.07s with binary_c           *
*                   Ran 1 systems                  *
*           with a total probability of 1          *
*         This thread had 0 failing systems        *
*       with a total failed probability of 0       *
*   Skipped a total of 0 zero-probability systems  *
*                                                  *
****************************************************


****************************************************
*                Process 0 finished:               *
*  generator started at 2023-05-19T11:46:05.167359 *
* generator finished at 2023-05-19T11:46:05.363291 *
*                   total: 0.20s                   *
*           of which 0.14s with binary_c           *
*                   Ran 2 systems                  *
*           with a total probability of 2          *
*         This thread had 0 failing systems        *
*       with a total failed probability of 0       *
*   Skipped a total of 0 zero-probability systems  *
*                                                  *
****************************************************


**********************************************************
*  Population-4d2bf0d253dc4fce92d16ec4f79f1d58 finished! *
*               The total probability is 3.              *
*  It took a total of 0.59s to run 3 systems on 2 cores  *
*                   = 1.19s of CPU time.                 *
*              Maximum memory use 337.516 MB             *
**********************************************************

No failed systems were found in this run.
[4]:
{'population_id': '4d2bf0d253dc4fce92d16ec4f79f1d58',
 'evolution_type': 'source_file',
 'failed_count': 0,
 'failed_prob': 0,
 'failed_systems_error_codes': [],
 'errors_exceeded': False,
 'errors_found': False,
 'total_probability': 3,
 'total_count': 3,
 'start_timestamp': 1684493165.101546,
 'end_timestamp': 1684493165.6958666,
 'time_elapsed': 0.59432053565979,
 'total_mass_run': 21.6,
 'total_probability_weighted_mass_run': 21.6,
 'zero_prob_stars_skipped': 0}
[5]:
combined_events_filename = os.path.join(
    event_based_logging_population.population_options['event_based_logging_output_directory'],
    event_based_logging_population.population_options['event_based_logging_combined_events_filename']
)
print(combined_events_filename)
with open(combined_events_filename, 'r') as f:
    combined_events = f.readlines()

for event in combined_events:
    print(event)
/tmp/binary_c_python-david/notebooks/notebook_events_based_logging/events/all_events.dat
4FACA930-D60C-45B2-876C-8FDC4E8464F5    1       0       SN_BINARY       10      0.1     1e+09   9.09806e+06     0       2.848420259305e+01      0.02    59335   1.33487 13      14      0       0       -1      3.23995e+06     -1.67576        9.18507 5       727.198 2.81957 1.82729 2.81957 0.00534839      0       3.23834e+06     9.89406e+06     0.1     0.134597        0       0       1       404.39  4.31965 -0.210048

6CF84E95-DE5F-4884-927D-CD07208DED8F    1       0       SN_SINGLE       10      2.848380621878e+01      0.02    22227   1.33469 13      14      0       0       9.1865  5       724.338 2.81957 1.82533 2.81957 0.00539487      0       1       438.865 2.75425 0.306326

BE487565-5C4B-4DB8-9D67-61E1E2436CA1    1       0       RLOF    1       0.5     100     103.802 0       1.231494440429e+04      0.02    21341   0.970542        0.50131 41.8386 0.468712        95.7159 0.244766        3       0       77741.6 3       1       0       1.231494317672e+10      0       0.337181        0.50131 0.016932        0.468712        2.79234 0.00161588      10      0       6111.9  3       0       1       1.231494440429e+10      0       0.633361        0       0.50131 0       0.633361        202.734 1

As we see above, we now have some events that we can analyse.

We can parse the contents of each of the events with the event_based_logging_parameter_list_dict.

[6]:
def recast_values(event_dict):
    """
    Function to recast the values from strings to numerical values
    """

    for key in event_dict.keys():
        # Leave string-valued entries untouched
        if key in ['uuid', 'event_type']:
            continue

        # Interpret values containing a '.' as floats and the rest as
        # integers; fall back to float for exponent notation like '1e+09'
        try:
            if '.' in event_dict[key]:
                event_dict[key] = float(event_dict[key])
            else:
                event_dict[key] = int(event_dict[key])
        except ValueError:
            event_dict[key] = float(event_dict[key])

    return event_dict

def parse_events(events_list, parsing_dict):
    """
    Function to parse the raw event lines and create a dictionary for each event
    """

    parsed_events_list = []

    # Loop over output
    for event in events_list:
        split_event = event.split()

        # Parse output and create dictionary
        event_type = split_event[EVENT_TYPE_INDEX]
        parameter_names = parsing_dict[event_type]
        event_dict = {parameter_name: parameter_value for (parameter_name, parameter_value) in zip(parameter_names, split_event)}

        # recast values
        event_dict = recast_values(event_dict=event_dict)

        #
        parsed_events_list.append(event_dict)

    return parsed_events_list

event_based_logging_parameter_list_dict = event_based_logging_population.population_options['event_based_logging_parameter_list_dict']

parsed_events = parse_events(events_list=combined_events, parsing_dict=event_based_logging_parameter_list_dict)

for parsed_event in parsed_events:
    print(json.dumps(parsed_event, indent=4))

{
    "uuid": "4FACA930-D60C-45B2-876C-8FDC4E8464F5",
    "probability": 1,
    "event_number": 0,
    "event_type": "SN_BINARY",
    "zams_mass_1": 10,
    "zams_mass_2": 0.1,
    "zams_orbital_period": 1000000000.0,
    "zams_separation": 9098060.0,
    "zams_eccentricity": 0,
    "time": 28.48420259305,
    "metallicity": 0.02,
    "random_seed": 59335,
    "SN_post_SN_mass": 1.33487,
    "SN_post_SN_stellar_type": 13,
    "SN_type": 14,
    "SN_fallback_fraction": 0,
    "SN_fallback_mass": 0,
    "SN_post_SN_ecc": -1,
    "SN_post_SN_orbital_period": 3239950.0,
    "SN_post_SN_separation": -1.67576,
    "SN_pre_SN_mass": 9.18507,
    "SN_pre_SN_stellar_type": 5,
    "SN_pre_SN_radius": 727.198,
    "SN_pre_SN_core_mass": 2.81957,
    "SN_pre_SN_CO_core_mass": 1.82729,
    "SN_pre_SN_He_core_mass": 2.81957,
    "SN_pre_SN_fraction_omega_crit": 0.00534839,
    "SN_pre_SN_ecc": 0,
    "SN_pre_SN_orbital_period": 3238340.0,
    "SN_pre_SN_separation": 9894060.0,
    "SN_pre_SN_companion_mass": 0.1,
    "SN_pre_SN_companion_radius": 0.134597,
    "SN_pre_SN_companion_stellar_type": 0,
    "SN_starnum": 0,
    "SN_counter": 1,
    "SN_kick_v": 404.39,
    "SN_kick_omega": 4.31965,
    "SN_kick_phi": -0.210048
}
{
    "uuid": "6CF84E95-DE5F-4884-927D-CD07208DED8F",
    "probability": 1,
    "event_number": 0,
    "event_type": "SN_SINGLE",
    "zams_mass_1": 10,
    "time": 28.48380621878,
    "metallicity": 0.02,
    "random_seed": 22227,
    "SN_post_SN_mass": 1.33469,
    "SN_post_SN_stellar_type": 13,
    "SN_type": 14,
    "SN_fallback_fraction": 0,
    "SN_fallback_mass": 0,
    "SN_pre_SN_mass": 9.1865,
    "SN_pre_SN_stellar_type": 5,
    "SN_pre_SN_radius": 724.338,
    "SN_pre_SN_core_mass": 2.81957,
    "SN_pre_SN_CO_core_mass": 1.82533,
    "SN_pre_SN_He_core_mass": 2.81957,
    "SN_pre_SN_fraction_omega_crit": 0.00539487,
    "SN_starnum": 0,
    "SN_counter": 1,
    "SN_kick_v": 438.865,
    "SN_kick_omega": 2.75425,
    "SN_kick_phi": 0.306326
}
{
    "uuid": "BE487565-5C4B-4DB8-9D67-61E1E2436CA1",
    "probability": 1,
    "event_number": 0,
    "event_type": "RLOF",
    "zams_mass_1": 1,
    "zams_mass_2": 0.5,
    "zams_orbital_period": 100,
    "zams_separation": 103.802,
    "zams_eccentricity": 0,
    "time": 12314.94440429,
    "metallicity": 0.02,
    "random_seed": 21341,
    "RLOF_initial_mass_accretor": 0.970542,
    "RLOF_initial_mass_donor": 0.50131,
    "RLOF_initial_radius_accretor": 41.8386,
    "RLOF_initial_radius_donor": 0.468712,
    "RLOF_initial_separation": 95.7159,
    "RLOF_initial_orbital_period": 0.244766,
    "RLOF_initial_stellar_type_accretor": 3,
    "RLOF_initial_stellar_type_donor": 0,
    "RLOF_initial_orbital_angular_momentum": 77741.6,
    "RLOF_initial_stability": 3,
    "RLOF_initial_starnum_accretor": 1,
    "RLOF_initial_starnum_donor": 0,
    "RLOF_initial_time": 12314943176.72,
    "RLOF_initial_disk": 0,
    "RLOF_final_mass_accretor": 0.337181,
    "RLOF_final_mass_donor": 0.50131,
    "RLOF_final_radius_accretor": 0.016932,
    "RLOF_final_radius_donor": 0.468712,
    "RLOF_final_separation": 2.79234,
    "RLOF_final_orbital_period": 0.00161588,
    "RLOF_final_stellar_type_accretor": 10,
    "RLOF_final_stellar_type_donor": 0,
    "RLOF_final_orbital_angular_momentum": 6111.9,
    "RLOF_final_stability": 3,
    "RLOF_final_starnum_accretor": 0,
    "RLOF_final_starnum_donor": 1,
    "RLOF_final_time": 12314944404.29,
    "RLOF_final_disk": 0,
    "RLOF_total_mass_lost": 0.633361,
    "RLOF_total_mass_accreted": 0,
    "RLOF_total_mass_transferred": 0.50131,
    "RLOF_total_mass_lost_from_accretor": 0,
    "RLOF_total_mass_lost_from_common_envelope": 0.633361,
    "RLOF_total_time_spent_masstransfer": 202.734,
    "RLOF_episode_number": 1
}

The parameters contained in each of these dictionaries are described on the event-based logging page in the documentation.

The next step would be to run a larger population and log all the events of interest. We can then use the UUIDs to cross-match events with each other and perform (complex) queries to select e.g. the BHBH DCO_formation events that have experienced at least one pulsational pair-instability supernova but have not undergone any unstable mass transfer.
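As a minimal sketch of such cross-matching with the parsed events from above: grouping events by UUID reconstructs each system's event history, after which we can filter on combinations of event types. The selection below uses the supernova event types seen in this notebook; a query on e.g. DCO_formation events would work the same way.

from collections import defaultdict

# Group the parsed events by UUID: events sharing a UUID come from the
# same system, so each group is that system's event history
events_per_system = defaultdict(list)
for event in parsed_events:
    events_per_system[event["uuid"]].append(event)

# Example selection: systems that experienced at least one supernova event
for uuid, events in events_per_system.items():
    event_types = [event["event_type"] for event in events]
    if "SN_BINARY" in event_types or "SN_SINGLE" in event_types:
        print(uuid, event_types)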